Introduction to Android sensors
Difficulty Level:
Tags: other, android, opensignals mobile, android sensor basics

The OpenSignals mobile application ( Google Play link ) allows you to acquire data from the internal sensors that are built into the hardware of an Android smartphone.

In this Jupyter Notebook we will have a detailed look at the sensor types that are part of the Android system. We will explore how the Android system acquires data for each sensor and delve into the limitations of the Android system when acquiring data. The focus of this Jupyter Notebook will be the data returned by the OpenSignals mobile application and the Android system.

Thus, the functions used here will not always be explained in detail. If you want to take a more detailed look at the functions used, you can always visit our GitHub biosignalsnotebooks repository . We will also provide direct links to each function used. If some context on a function's behaviour is needed, it will be carefully given.

In case you want to have a more in-depth look at how Android handles its sensors, you can have a look at the Android developers page or the source page on Android sensors.


1 - Package imports

First, let's import some useful libraries that will be used for visualisation purposes.

In [1]:
# biosignalsnotebooks package
import biosignalsnotebooks as bsnb

# package for using operating system dependent functionality
import os

# numpy package
from numpy import cumsum

2 - How the Android operating system acquires sensor data and its limitations

The Android system acquires sensor data through an event-based system. This means that the operating system waits for an event to happen and, when the event occurs, picks up on it and handles it accordingly. The event structure used for acquiring sensor data is called a sensor event . One major limitation of these sensor events is that they may not occur at fixed time intervals. The reason for this is that the Android operating system tries to optimise the battery consumption of the phone and thus only registers these events when really necessary. Therefore, the data acquired from internal Android sensors with the OpenSignals mobile application will most likely not be sampled equidistantly.

The influence of the Android operating system on the sampling capacity of its internal sensors also has other implications. Mainly, when setting a sampling rate for the Android sensors in the OpenSignals mobile application, the system only uses this rate as a suggestion and may sample at lower or higher rates. When other applications that make use of internal sensors are running at the same time, the sampling capacity may also be altered. Additionally, due to the event-based system, multiple sensors are not necessarily sampled at the same time instant. This has the effect that sensors may start and stop recording at different times. In order to know when the Android system picked up on a sensor event , a timestamp is associated with each event. This timestamp is the elapsed time, in nanoseconds, since the last boot of the phone.

Other aspects, such as the hardware of a phone, need to be accounted for as well. Android supports a variety of phone manufacturers. This has the consequence that each phone model released by a manufacturer may include different sensor types with their own particular specifications. Therefore, doing a recording with two distinct phones may result in different data recordings.

Summing it up, we can thus state the following important facts:

  • Android sensor data may not be sampled equidistantly;

  • The sampling rate set in OpenSignals mobile application may be altered by the Android operating system to lower or higher rates;

  • When acquiring data from multiple sensors at once the system does not sample all sensors at the same time;

  • Each sensor event is associated with a timestamp. The timestamp is based on the time elapsed (in nanoseconds) since the last boot of the phone;

  • The number of internal sensors available and their recorded data may differ between phone models.

These facts should always be kept in mind when acquiring data from Android sensors. However, some of these limitations can be overcome with simple post-processing steps. For instance, the non-equidistant sampling can be corrected by resampling the data. We show how this is achieved in our Resampling of signals recorded with Android sensors notebook.
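To see what the non-equidistant sampling looks like in practice, here is a minimal sketch (using hypothetical timestamps, not the recorded data) that inspects the intervals between consecutive sensor events and estimates the average sampling rate:

```python
from numpy import array, diff

# hypothetical sensor event timestamps in nanoseconds since the last boot;
# the target rate is 100 Hz, but the intervals jitter around 10 ms
timestamps_ns = array([1_000_000_000, 1_010_200_000, 1_019_700_000, 1_030_500_000])

# convert to seconds and compute the intervals between consecutive events
intervals = diff(timestamps_ns * 1e-9)
avg_rate = 1 / intervals.mean()

print(intervals)  # intervals are close to, but not exactly, 0.01 s
print(avg_rate)   # average sampling rate in Hz
```

Note that the average rate lands slightly below the requested 100 Hz, exactly the kind of deviation visible in the data report later in this notebook.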

3 - The phone's coordinate system

The Android system uses a standard 3-axis coordinate system to define the device's orientation. Most sensors use the coordinate system shown in the image on the right. Assuming the phone is held in the hand, parallel to and with the screen facing towards the user's face, the directions of the three axes are defined as follows:

  • X-axis: Is horizontal and points to the right;

  • Y-axis: Is vertical and points upwards towards the sky;

  • Z-axis: Is perpendicular to the screen and points towards the user's face.
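To make these axis directions concrete, consider a hypothetical accelerometer reading (made-up values, not from the recording) taken while the phone lies flat on a table with the screen facing up; gravity then shows up almost entirely on the z-axis:

```python
from math import sqrt

# hypothetical accelerometer reading (in m/s^2) with the phone flat on a
# table, screen up: the z-axis points out of the screen, so it carries g
x, y, z = 0.12, -0.08, 9.79

# the magnitude of the reading should be close to g (about 9.81 m/s^2)
magnitude = sqrt(x ** 2 + y ** 2 + z ** 2)
print(magnitude)
```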

4 - Android sensor reporting modes

Android defines four types of reporting modes for their sensors. These are:

  • Continuous: Events are reported at a constant rate, as long as the Android system does not alter the rate;

  • On Change: Events are reported only when the value changes;

  • Special Trigger: Events are reported as described in the description of the sensor. The rate set might not have an impact on the rate of event delivery;

  • One Shot: Events are reported in one-shot mode. Upon detection of an event, the sensor deactivates itself and then sends a single event.

Depending on their reporting mode, the sensors show different sampling behaviours. Most Android sensors are continuous ; however, we will describe the reporting mode of each sensor in a later section.
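As a toy illustration of the difference between these modes (a sketch with made-up values, not OpenSignals output), an On Change sensor only generates events when the measured value actually changes:

```python
# hypothetical raw readings of an On Change style sensor (e.g. proximity in cm)
readings = [5.0, 5.0, 5.0, 0.0, 0.0, 5.0]

# an On Change sensor reports the first value and then only the changes,
# while a Continuous sensor would report all six readings
events = [readings[0]] + [curr for prev, curr in zip(readings, readings[1:]) if curr != prev]

print(events)  # -> [5.0, 0.0, 5.0]
```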

5 - Loading the data, getting useful information and plotting the sensor timeline

Let us have a look at some data acquired from Android sensors using the OpenSignals mobile application . The data shown in this and the following sections has been acquired using a Xiaomi Mi A1 while taking a walk outside. The sampling rate was set to 100 Hz .

5.1 - Loading the data and getting data report

The facts stated in the previous sections can easily be confirmed by having a look at the data. Thus, before we explore the details of each sensor, let us first load the data using the load_android_data(...) function.
The report that is generated by the function already shows some of the limitations stated above. For instance, we can see that the sensors sample on average either at approximately 100 Hz, at around 50 Hz, or far below (an average sampling rate is displayed since the Android system does not sample equidistantly). The GPS , Light , Proximity and Significant Motion sensors sample far below 100 Hz. The average sampling rate of the Significant Motion sensor was set to zero because it only acquired one sample. Furthermore, we can observe that the timestamps for when the sensors started and stopped recording are all different.

In [2]:
# set file path
path = '../../images/other/intro_to_android_sensors/'

# get a list with all the files within that folder
file_list = os.listdir(path)

# make full path for each file
file_list = [path + file for file in file_list if ".txt" in file]

# load the sensor data and print a data report
sensor_data, report = bsnb.load_android_data(file_list, print_report=True)
names: ['Acc', 'AccUnc', 'GameRot', 'GeoMagRot', 'GPS', 'Grav', 'Gyr', 'GyrUnc', 'Light', 'LinAcc', 'Mag', 'MagUnc', 'Proximity', 'Rot', 'SigMotion', 'Steps', 'Detected']

number of samples: [29499, 29499, 29496, 14640, 225, 29497, 29497, 29497, 285, 29497, 14644, 14644, 3, 29497, 1, 290, 546]

starting times: [1465354449816399.0, 1465354449816399.0, 1465354465799108.0, 1465354526097131.0, 1465356754403127.0, 1465354455790085.0, 1465354455790085.0, 1465354455790085.0, 1465354542698694.0, 1465354465799108.0, 1465354425592308.0, 1465354425592308.0, 1465354576786829.0, 1465354465799108.0, 1465363412423127.0, 1465354466613492.0, 1465355854344202.0]

stopping times: [1465649345498891.0, 1465649345498891.0, 1465649341501088.0, 1465649358713002.0, 1465648511280828.0, 1465649341501088.0, 1465649341501088.0, 1465649341501088.0, 1465648996103139.0, 1465649341501088.0, 1465649358713002.0, 1465649358713002.0, 1465450249350135.0, 1465649341501088.0, 1465363412423127.0, 1465648060861440.0, 1465648060861440.0]

avg. sampling rates: [100.0285923168788, 100.0285923168788, 100.0251963859691, 49.65190149248988, 0.7677625349060699, 100.02519247092283, 100.02519247092283, 100.02519247092283, 0.9644989519998823, 100.02858764538209, 49.64854393275299, 49.64854393275299, 0.02090463483875917, 100.02858764538209, 0, 0.984351709953072, 1.8651192490552893]

min. sampling rate: 0.0

max. sampling rate: 100.0285923168788

mean sampling rate: 56.10392706835343

std. sampling rate: 44.67806852718664

starting order: ['Mag', 'MagUnc', 'Acc', 'AccUnc', 'Grav', 'Gyr', 'GyrUnc', 'GameRot', 'LinAcc', 'Rot', 'Steps', 'GeoMagRot', 'Light', 'Proximity', 'Detected', 'GPS', 'SigMotion']

stopping order: ['SigMotion', 'Proximity', 'Detected', 'Steps', 'GPS', 'Light', 'GameRot', 'Grav', 'Gyr', 'GyrUnc', 'LinAcc', 'Rot', 'Acc', 'AccUnc', 'GeoMagRot', 'Mag', 'MagUnc']

5.2 - Plotting the sensor acquisition timeline

In case we want visual insight into when each sensor acquired its samples, we can use the plot_android_sensor_timeline(...) function. In the process of generating the data to be plotted, the function automatically shifts the time axes of all sensors to start at zero and converts them to seconds.

This function takes the following inputs:

  • sensor_data (list): A list containing the Android sensor data (including the time axis). The list can be obtained by calling the load_android_data(...) function;

  • report (dict): A dictionary containing information on the sensors. The dictionary can be obtained by calling the load_android_data(...) function;

  • plot_until_seconds (int or float, optional): Int or float indicating how many seconds of the timeline should be plotted. The value can be either -1 for plotting the entire timeline or a value > 0. If not specified, then -1 is used;

  • line_thickness (float, optional): Float indicating the thickness of the timeline lines. If not specified a thickness of 1 is used.

For our purpose, we will only plot the first 10 seconds of the data. The line thickness is set to 1.5 in order to properly distinguish most of the lines from each other.
Looking at the plot generated by the function, we can confirm that the system does not sample at a fixed rate. It is also again visible that the sensors start recording at different times. Additionally, the plot already gives us a hint about the reporting mode of each sensor. The Accelerometer sensor, for example, is most likely a Continuous sensor, while the Light sensor is probably an On Change sensor.

In [3]:
bsnb.plot_android_sensor_timeline(sensor_data, report, plot_until_seconds=10, line_thickness=1.5)

6 - A closer look at the individual sensors

Android divides its sensors into three major categories. We added a fourth one, in which we currently place the GPS sensor. These are:

  • Motion Sensors

  • Position Sensors

  • Environment Sensors

  • Special Sensors

In the following sections we will explore the sensors within each category and show what kind of data each sensor records. For each presented sensor, a plot of the data will be shown where appropriate. Since we do not want to overcrowd the plots with data, only a single channel of the data (in some cases, two channels) will be plotted. For sensors where plotting the data does not make much sense, the ".txt" file is presented instead.

To have the same time axis as in the sensor acquisition timeline plotted above, we will shift the time axes of each sensor according to the timestamp of the sensor that first started recording and convert the time from nanoseconds to seconds.

In [4]:
# get the earliest timestamp of the entire recording
start_time = min(report['starting times'])

# make a copy of the sensor_data list
shifted_sensor_data = sensor_data.copy()

# cycle through the sensor data list
for data in shifted_sensor_data:
    
    # check the dimensionality of the data (the significant motion sensor, for example, is one-dimensional)
    if(data.ndim == 1):
        
        # get the time axis
        time_axis = data[:1]

    else:  # multidimensional array

        # get the time axis
        time_axis = data[:, 0]
    
    # shift time axis to start at zero and convert to seconds
    time_axis = time_axis - start_time
    time_axis = time_axis * 1e-9
    
    # override the time_axis in the data array
    if(data.ndim == 1):
        
        data[:1] = time_axis
    
    else:
        
        data[:,0] = time_axis

6.1 - Motion Sensors

Motion sensors can be used to track the movement of the device. These movements may include tilt, shake, rotation or swing. Depending on which sensor is used, either the motion relative to the device's coordinate system or the motion relative to the world's coordinate system is measured.

The sensors that are part of this category are:

Accelerometer (Continuous):
The accelerometer measures the acceleration force in m/s$^2$, including the force of gravity, that is applied to a device on all three physical axes (x, y, and z).

In [5]:
# get acc data
acc = shifted_sensor_data[report['names'].index('Acc')]

# plot x-axis
bsnb.plot([acc[:,0]], [acc[:,1]], legend_label=["x-axis"], y_axis_label=["Accelerometer"], x_axis_label="Time (s)")

Uncalibrated Accelerometer (Continuous):
The uncalibrated accelerometer measures the acceleration in m/s$^2$ along all three physical axes (x, y, and z) without bias compensation (factory bias and temperature compensation are, however, applied to the uncalibrated measurements). Additionally, it provides a bias estimation for each axis. This means that the uncalibrated accelerometer has a total of six data channels.

In [6]:
# get acc uncalibrated data
unc_acc = shifted_sensor_data[report['names'].index('AccUnc')]

# plot x-axis
bsnb.plot([unc_acc[:,0]], [unc_acc[:,1]], legend_label=["x-axis"], y_axis_label=["Uncalibrated Accelerometer"], x_axis_label="Time (s)")

Linear Accelerometer (Continuous):
The linear accelerometer measures the acceleration in m/s$^2$, excluding gravity, along all three physical axes (x, y, and z).

In [7]:
# get linear acc data
lin_acc = shifted_sensor_data[report['names'].index('LinAcc')]

# plot x-axis
bsnb.plot([lin_acc[:,0]], [lin_acc[:,1]], legend_label=["x-axis"], y_axis_label=["Linear Accelerometer"], x_axis_label="Time (s)")
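Conceptually, the linear accelerometer output corresponds to the accelerometer reading with the gravity contribution removed. A rough sketch with hypothetical values (in practice the two sensors are not sampled at identical instants, so this is only illustrative):

```python
from numpy import array

# hypothetical simultaneous readings in m/s^2 (made-up values)
acc  = array([0.5, 0.3, 10.1])  # accelerometer (includes gravity)
grav = array([0.1, 0.0, 9.8])   # gravity sensor

# linear acceleration is conceptually the accelerometer reading minus gravity
lin_acc_approx = acc - grav
print(lin_acc_approx)  # approximately [0.4, 0.3, 0.3]
```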

Gravity (Continuous):
The gravity sensor measures the force of gravity in m/s$^2$ that is applied to a device along all three physical axes (x, y, z).

In [8]:
# get gravity data
grav = shifted_sensor_data[report['names'].index('Grav')]

# plot x-axis
bsnb.plot([grav[:,0]], [grav[:,1]], legend_label=["x-axis"], y_axis_label=["Gravity Sensor"], x_axis_label="Time (s)")

Gyroscope (Continuous):
The gyroscope measures the device's rate of rotation in rad/s around each of the three physical axes (x, y, and z).

In [9]:
# get gyroscope data
gyr = shifted_sensor_data[report['names'].index('Gyr')]

# plot x-axis
bsnb.plot([gyr[:,0]], [gyr[:,1]], legend_label=["x-axis"], y_axis_label=["Gyroscope"], x_axis_label="Time (s)")

Uncalibrated Gyroscope (Continuous):
The uncalibrated gyroscope measures the device's rate of rotation in rad/s around each of the three physical axes (x, y, and z) without any drift compensation. Additionally, it provides the estimated drift for each axis. Thus, the uncalibrated gyroscope has a total of six data channels.

In [10]:
# get gyroscope uncalibrated data
unc_gyr = shifted_sensor_data[report['names'].index('GyrUnc')]

# plot x-axis and estimated drift of axis
bsnb.plot([unc_gyr[:,0], unc_gyr[:,0]], [unc_gyr[:,1], unc_gyr[:,4]], 
           legend_label=["x-axis", "x-axis drift"], y_axis_label=["Uncalibrated Gyroscope", "Uncalibrated Gyroscope"], x_axis_label="Time (s)")
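Given the six-channel layout described above (rotation rates in the first three data channels, drift estimates in the last three, after the time axis), a calibrated estimate can be approximated by subtracting the drift. A minimal sketch with a hypothetical data array:

```python
from numpy import array

# hypothetical uncalibrated gyroscope samples (made-up values): columns are
# [time, x_rate, y_rate, z_rate, x_drift, y_drift, z_drift]
unc = array([
    [0.00, 0.105, -0.020, 0.004, 0.005, 0.001, -0.002],
    [0.01, 0.110, -0.018, 0.006, 0.005, 0.001, -0.002],
])

# subtracting the estimated drift approximates the calibrated rotation rate
calibrated = unc[:, 1:4] - unc[:, 4:7]
print(calibrated[0])  # approximately [0.1, -0.021, 0.006]
```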

Rotation Vector (Continuous):
The rotation vector is a so-called attitude composite sensor. This means that the sensor is based on a composition of other sensors; it is derived from the accelerometer, magnetic field and gyroscope sensors. It defines the device's orientation relative to an East-North-Up coordinate frame. In this frame, the x-axis points east and is parallel to the ground, the y-axis is also parallel to the ground and points north, and the z-axis is perpendicular to the ground, pointing upwards to the sky.

The rotation of the phone is relative to this system and can be seen as rotating the phone by an angle $\theta$ around a rotation axis. The Android system provides the coordinates of this rotation as the four unit-less components (x, y, z, and w) of a unit quaternion, where w is the scalar component of that quaternion.

The components can be described as follows:

  • Rotation vector component along the x-axis (x $\cdot$ sin($\theta$/2))

  • Rotation vector component along the y-axis (y $\cdot$ sin($\theta$/2))

  • Rotation vector component along the z-axis (z $\cdot$ sin($\theta$/2))

  • Scalar component of the rotation vector (cos($\theta$/2))

In case you have never heard of a quaternion before, we recommend watching this introductory video .

In [11]:
# get rotation vector data
rot_vec = shifted_sensor_data[report['names'].index('Rot')]

# plot x-axis rotation component
bsnb.plot([rot_vec[:,0]], [rot_vec[:,1]], 
           legend_label=["x-axis rotation"], y_axis_label=["Rotation Vector"], x_axis_label="Time (s)")
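To make the quaternion components listed above more tangible, here is a small sketch (with a made-up rotation, not sensor output) that builds the four components for a 90-degree rotation around the z-axis and recovers the angle $\theta$ from the scalar component:

```python
from math import sin, cos, acos, pi, isclose

# hypothetical rotation: theta = 90 degrees (pi/2) around the z-axis (0, 0, 1)
theta = pi / 2
qx = 0.0 * sin(theta / 2)  # rotation vector component along the x-axis
qy = 0.0 * sin(theta / 2)  # rotation vector component along the y-axis
qz = 1.0 * sin(theta / 2)  # rotation vector component along the z-axis
qw = cos(theta / 2)        # scalar component

# for a unit quaternion, the rotation angle follows from the scalar component
recovered_theta = 2 * acos(qw)
print(recovered_theta)  # -> pi/2 (about 1.5708)
```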

Significant Motion (One Shot):
Due to its One Shot reporting mode, the significant motion sensor triggers an event each time a significant motion is detected. After triggering the event, it disables itself. A significant motion is a motion that might lead to a change in the user's location, for example walking, biking, or sitting in a moving car.

In the data we recorded a significant motion was detected only once. This motion was registered when the person recording the data started walking.

In [12]:
# Embedding of .txt file
from IPython.display import IFrame
IFrame(src="../../images/other/intro_to_android_sensors/opensignals_ANDROID_SIGNIFICANT_MOTION_2020-07-28_19-01-34.txt", width="100%", height="350")
Out[12]:

Step Counter (On Change):
The step counter counts the number of steps taken by the user since the last reboot while the sensor was activated. The step counter is only reset to zero when a system reboot is performed. It usually has more latency (up to 10 seconds) but is more accurate than the step detector sensor.
In case you want to count the number of steps since the start of the acquisition, just subtract the first value of the data array from all values in that array, as shown below.

In [13]:
# get the data of the step counter
step_counter = sensor_data[-2]  # in our case the step counter is the penultimate entry in the sensor_data list

# shift values to start at zero steps
steps_since_recording_start = step_counter[:,1] - step_counter[0, 1]
In [14]:
# plot step_counter and steps since beginning of recording
bsnb.plot([step_counter[:,0], step_counter[:,0]], [step_counter[:,1], steps_since_recording_start], 
           legend_label=["step counter", "steps since recording started"], y_axis_label=["Step Counter", "Step Counter"], x_axis_label="Time (s)")

Step Detector (Special Trigger):
Similar to the step counter, the step detector sensor triggers an event each time the user takes a step. However, instead of reporting the number of steps taken, the step detector just reports a 1.0 for each triggered event. The latency is expected to be below 2 seconds.
If you want a display similar to the step counter's, you can simply calculate the cumulative sum over the data, as presented below.

In [15]:
# Embedding of .txt file
from IPython.display import IFrame
IFrame(src="../../images/other/intro_to_android_sensors/opensignals_ANDROID_STEP_DETECTOR_2020-07-28_19-01-34.txt", width="100%", height="350")
Out[15]:
In [16]:
# get the data of step detector
step_detector = sensor_data[-1] # in our case the step detector is the last value in the sensor_data list

# get the values of the detector
trigger_vals = step_detector[:,1]

# calculate the cumulative sum (cast to int because otherwise the values are floats)
steps = cumsum(trigger_vals, dtype=int)

# print the first twenty steps
print('steps: {}'.format(steps[:20]))
steps: [ 1  2  3  4  5  6  7  8  9 10 11 12 13 14 15 16 17 18 19 20]

6.2 - Position Sensors

Position sensors can be used to track the phone's position relative to the world's coordinate system.

Game Rotation Vector (Continuous):
The game rotation vector is similar to the rotation vector presented above, the difference being that the sensor is only derived from an accelerometer and a gyroscope. Thus, the y-axis does not point north but to some other reference instead. It has the same data fields as the rotation vector. The data of the sensor is unitless.

In [17]:
# get game rotation vector data
game_rot_vec = shifted_sensor_data[report['names'].index('GameRot')]

# plot x-axis
bsnb.plot([game_rot_vec[:,0]], [game_rot_vec[:,1]], 
           legend_label=["x-axis rotation"], y_axis_label=["Game Rotation Vector"], x_axis_label="Time (s)")

Geomagnetic Rotation Vector (Continuous):
As the name suggests, this sensor also has similarities to the rotation vector. The difference here is that the geomagnetic rotation vector is derived from an accelerometer and a magnetic field sensor. It has the same data fields as the rotation vector. The data of the sensor is unitless.

In [18]:
# get geomagnetic rotation vector data
geo_rot_vec = shifted_sensor_data[report['names'].index('GeoMagRot')]

# plot x-axis
bsnb.plot([geo_rot_vec[:,0]], [geo_rot_vec[:,1]], 
           legend_label=["x-axis rotation"], y_axis_label=["Geomagnetic Rotation Vector"], x_axis_label="Time (s)")

Magnetic Field (Continuous):
The magnetic field sensor measures the geomagnetic field strength in $\mu$T along all three physical axes (x, y, and z).

In [19]:
# get magnetometer data
mag = shifted_sensor_data[report['names'].index('Mag')]

# plot x-axis
bsnb.plot([mag[:,0]], [mag[:,1]], 
           legend_label=["x-axis"], y_axis_label=["Magnetic Field"], x_axis_label="Time (s)")

Uncalibrated Magnetic Field (Continuous):
The uncalibrated magnetic field sensor measures the geomagnetic field strength in $\mu$T along all three physical axes (x, y, and z). Additionally, it reports an estimation of the hard iron bias along each axis. This means that, like the other uncalibrated sensors, it has a total of six data channels.

In [20]:
# get uncalibrated magnetometer data
mag_unc = shifted_sensor_data[report['names'].index('MagUnc')]

# plot x-axis and the hard iron bias of the x-axis
bsnb.plot([mag_unc[:,0], mag_unc[:,0]], [mag_unc[:,1], mag_unc[:,4]], 
           legend_label=["x-axis", "x-axis hard iron bias"], y_axis_label=["Uncalibrated Magnetic Field", "Uncalibrated Magnetic Field"], x_axis_label="Time (s)")

Proximity (On Change):
The proximity sensor reports the distance from the sensor to the closest visible surface in cm. Depending on the phone model you are using, the sensor either reports a continuous spectrum of distance values or a two-value spectrum consisting of one value for close and one value for far . The phone that we used only reports two values, as can be seen in the ".txt" file of the sensor.

Since the proximity sensor is an On Change sensor, it will only trigger an event when there is a significant change in distance. This means that, while the phone is sitting in a pocket, lying on a surface or not moving, the sensor will not report any new values. This behaviour can also be seen in the ".txt" file of the sensor: the sensor only registered a change in distance three times.

In [21]:
# Embedding of .txt file
from IPython.display import IFrame
IFrame(src="../../images/other/intro_to_android_sensors/opensignals_ANDROID_PROXIMITY_2020-07-28_19-01-34.txt", width="100%", height="350")
Out[21]:
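If you need On Change data on a regular time grid (for instance, to align it with a continuous sensor), a common approach is a zero-order hold: each grid point keeps the last reported value. A minimal sketch with hypothetical proximity events (made-up values, not the recorded file):

```python
from numpy import arange, array, searchsorted

# hypothetical On Change proximity events: times in seconds, distances in cm
event_times  = array([0.0, 12.4, 30.1])
event_values = array([5.0, 0.0, 5.0])

# zero-order hold: on a regular 1 Hz grid, each point keeps the value of the
# most recent event at or before it
grid = arange(0.0, 40.0, 1.0)
last_event = searchsorted(event_times, grid, side='right') - 1
held = event_values[last_event]

print(held[10:15])  # -> [5. 5. 5. 0. 0.]
```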

6.3 - Environment Sensors

Environment sensors provide insight on the environment in which the user (or the phone) is located.

Ambient Temperature (On Change):
The ambient temperature sensor measures, as the name suggests, the ambient temperature of the location in which the user (or phone) is situated. Unfortunately, we cannot provide any data for this sensor, since the phone we used does not support it. However, due to its declaration as a sensor with an On Change reporting mode, we can deduce that it most likely behaves similarly to the proximity sensor or any other On Change sensor.

Light (On Change):
The light sensor measures the illuminance of the environment in lux. Due to its On Change reporting mode, the sensor is only triggered when there is a significant change in lighting (i.e. when the user pulls the phone out of a pocket, the user moves from a brightly lit environment into a darker one, etc.).

In [22]:
# get light sensor data
light = shifted_sensor_data[report['names'].index('Light')]

# plot illuminance
bsnb.plot([light[:,0]], [light[:,1]], 
           legend_label=["Illuminance"], y_axis_label=["Light Sensor"], x_axis_label="Time (s)")

Pressure (Continuous):
The pressure sensor reports the atmospheric pressure in hPa (hectopascal). Unfortunately, we cannot provide any data for this sensor, since the phone we used does not support it. However, due to its declaration as a sensor with a Continuous reporting mode, we can deduce that it most likely behaves similarly to the accelerometer or any other Continuous sensor.

Relative Humidity (On Change):
The relative humidity sensor measures the relative ambient humidity and returns its value as a percentage. Unfortunately, as with the pressure sensor, we cannot provide any data for this sensor, since the phone we used does not support it. However, due to its declaration as a sensor with an On Change reporting mode, we can deduce that it most likely behaves similarly to the proximity sensor or any other On Change sensor.

6.4 - Special sensors

In this category we place the GPS sensor. In future versions of the OpenSignals mobile application we also plan to include the audio and camera/video sensors of an Android smartphone; these will then also be placed into this category. Android does not provide reporting mode types for these sensors; however, it can be assumed that they are Continuous with their own system-dependent sampling rates.

GPS (Continuous):
The GPS sensor provides information on the position of the user (or phone). This position is given as latitude and longitude. Some phone models also provide the altitude at which the user (or phone) currently is. If your phone does not support altitude measurement, the third channel will be zero. The ".txt" file of the GPS sensor is shown below.

In [23]:
# Embedding of .txt file
from IPython.display import IFrame
IFrame(src="../../images/other/intro_to_android_sensors/opensignals_ANDROID_GPS_2020-07-28_19-01-34.txt", width="100%", height="350")
Out[23]:
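With latitude and longitude channels available, one can, for example, estimate the distance covered between consecutive GPS fixes using the haversine formula. A sketch with two hypothetical fixes (made-up coordinates, not taken from the recording):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (latitude, longitude) points."""
    r = 6_371_000  # mean Earth radius in metres
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * r * asin(sqrt(a))

# two hypothetical consecutive GPS fixes, roughly 14 m apart
d = haversine_m(38.7369, -9.1427, 38.7370, -9.1428)
print(d)  # distance in metres
```

Summing such distances over all consecutive fixes gives a rough estimate of the total distance walked during the acquisition.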

In this Jupyter Notebook we learned the basics of internal Android sensor data acquired with the OpenSignals mobile application . We discussed the limitations of the Android operating system concerning data acquisition and had a detailed look at each sensor that is supported by Android.

We hope that you have enjoyed this guide . biosignalsnotebooks is an environment in continuous expansion, so don't stop your journey and keep learning with the remaining Notebooks .

In [24]:
from biosignalsnotebooks.__notebook_support__ import css_style_apply
css_style_apply()
.................... CSS Style Applied to Jupyter Notebook .........................
Out[24]: